Linear classifier design under heteroscedasticity in Linear Discriminant Analysis

Abstract

Under normality and homoscedasticity assumptions, Linear Discriminant Analysis (LDA) is known to be optimal in terms of minimising the Bayes error for binary classification. In the heteroscedastic case, LDA is not guaranteed to minimise this error. Assuming heteroscedasticity, we derive a linear classifier, the Gaussian Linear Discriminant (GLD), that directly minimises the Bayes error for bina...
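
For intuition, the Bayes error of a fixed linear rule sign(w·x + b) under two Gaussian class-conditional densities with unequal covariances has a closed form, so it can be minimised numerically over the weight vector and threshold. The sketch below illustrates only that general idea, not the paper's GLD procedure; it assumes the class means, covariances and priors are known, and the names mu0, S0, mu1, S1, p0, p1 are placeholders.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def bayes_error(params, mu0, S0, mu1, S1, p0, p1):
        # Error probability of the rule "predict class 1 if w @ x + b > 0"
        # when class i is Gaussian N(mu_i, S_i) with prior p_i.
        d = len(mu0)
        w, b = params[:d], params[d]
        s0 = np.sqrt(w @ S0 @ w)              # std of w @ x + b under class 0
        s1 = np.sqrt(w @ S1 @ w)              # std of w @ x + b under class 1
        err0 = norm.sf(-(w @ mu0 + b) / s0)   # P(w @ x + b > 0  | class 0)
        err1 = norm.cdf(-(w @ mu1 + b) / s1)  # P(w @ x + b <= 0 | class 1)
        return p0 * err0 + p1 * err1

    def heteroscedastic_linear_rule(mu0, S0, mu1, S1, p0=0.5, p1=0.5):
        # Start from the homoscedastic LDA solution and refine it numerically.
        w0 = np.linalg.solve(0.5 * (S0 + S1), mu1 - mu0)
        b0 = -w0 @ (mu0 + mu1) / 2.0
        res = minimize(bayes_error, np.append(w0, b0),
                       args=(mu0, S0, mu1, S1, p0, p1), method="Nelder-Mead")
        return res.x[:-1], res.x[-1]

When the two covariance matrices are equal, the minimiser recovers the standard LDA direction with a prior-adjusted threshold, consistent with the homoscedastic optimality stated above.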

Similar articles

Fisher Linear Discriminant Analysis

Fisher Linear Discriminant Analysis (also called Linear Discriminant Analysis (LDA)) is a method used in statistics, pattern recognition and machine learning to find a linear combination of features that characterizes or separates two or more classes of objects or events. The resulting combination may be used as a linear classifier or, more commonly, for dimensionality reduction before later c...
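
A minimal two-class sketch of this construction (a generic textbook version using the pooled within-class scatter, not code from the cited work; X0 and X1 are placeholder sample matrices):

    import numpy as np

    def fisher_direction(X0, X1):
        # X0, X1: arrays of shape (n_i, d) holding the samples of each class.
        mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
        # Pooled within-class scatter matrix.
        Sw = ((len(X0) - 1) * np.cov(X0, rowvar=False)
              + (len(X1) - 1) * np.cov(X1, rowvar=False))
        # w maximises the Fisher ratio (w @ (mu1 - mu0))**2 / (w @ Sw @ w).
        w = np.linalg.solve(Sw, mu1 - mu0)
        return w / np.linalg.norm(w)

Projecting samples onto this direction (X @ w) yields the one-dimensional feature that can be thresholded as a linear classifier or kept as a reduced representation for a later classifier.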

Separable linear discriminant analysis

Linear discriminant analysis (LDA) is a popular technique for supervised dimension reduction. Due to the curse of dimensionality usually suffered by LDA when applied to 2D data, several two-dimensional LDA (2DLDA) methods have been proposed in recent years. Among these, the Y2DLDA method introduced by Ye et al. (2005) is an important development. The idea is to utilize the underlying 2D data ...

Geometric linear discriminant analysis

When it becomes necessary to reduce the complexity of a classifier, dimensionality reduction can be an effective way to do so. Linear Discriminant Analysis (LDA) is one approach to dimensionality reduction that makes use of a linear transformation matrix. The widely used Fisher’s LDA is “sub-optimal” when the sample class covariance matrices are unequal, meaning that ano...

Linear Discriminant Analysis Algorithms

We propose new algorithms for computing linear discriminants to perform data dimensionality reduction from R^n to R^p, with p < n. We propose alternatives to the classical Fisher’s distance criterion; namely, we investigate new criteria based on the Chernoff distance, J-divergence and Kullback-Leibler divergence. The optimization problems that emerge from using these alternative criteria are non-c...
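
As a rough illustration of a divergence-based criterion of this kind (a generic sketch under Gaussian class-conditional assumptions, not the algorithms proposed in that work; all names are placeholders), one can maximise the J-divergence between the two classes after projection onto a single direction; the problem is non-convex, so only a local optimum is guaranteed:

    import numpy as np
    from scipy.optimize import minimize

    def projected_j_divergence(w, mu0, S0, mu1, S1):
        # J-divergence (symmetrised Kullback-Leibler divergence) between the
        # two classes after projecting onto w, assuming Gaussian densities.
        m0, m1 = w @ mu0, w @ mu1
        v0, v1 = w @ S0 @ w, w @ S1 @ w
        kl01 = 0.5 * (np.log(v1 / v0) + (v0 + (m0 - m1) ** 2) / v1 - 1.0)
        kl10 = 0.5 * (np.log(v0 / v1) + (v1 + (m1 - m0) ** 2) / v0 - 1.0)
        return kl01 + kl10

    def divergence_direction(mu0, S0, mu1, S1):
        # Non-convex problem: maximise the projected J-divergence over
        # unit-norm directions, starting from the Fisher/LDA solution.
        w0 = np.linalg.solve(S0 + S1, mu1 - mu0)
        objective = lambda w: -projected_j_divergence(
            w / np.linalg.norm(w), mu0, S0, mu1, S1)
        res = minimize(objective, w0, method="Nelder-Mead")
        return res.x / np.linalg.norm(res.x)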

Journal

Journal title: Expert Systems with Applications

Year: 2017

ISSN: 0957-4174

DOI: 10.1016/j.eswa.2017.02.039